Burg Matrix Divergence-Based Hierarchical Distance Metric Learning for Binary Classification
Authors
Abstract
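For background on the divergence named in the title, the Burg matrix divergence (also known as the LogDet divergence) between d x d symmetric positive-definite matrices A and B has the standard form (stated here for reference, not quoted from the paper):

D_B(A, B) = \mathrm{tr}(A B^{-1}) - \log\det(A B^{-1}) - d.

In divergence-regularized metric learning, this quantity is typically used to keep a learned Mahalanobis matrix close to a reference matrix such as the identity.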
Related Articles
Large Scale Metric Learning for Distance-Based Image Classification
This paper studies large-scale image classification, in a setting where new classes and training images can continuously be added at (near) zero cost. We cast this problem into one of learning a low-rank metric, which is shared across all classes, and explore k-nearest neighbor (k-NN) and nearest class mean (NCM) classifiers. We also introduce an extension of the NCM classifier to allow for ric...
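As a minimal sketch of the nearest class mean rule mentioned above (the standard NCM formulation with a shared learned projection; the notation is illustrative, not taken from the listed paper):

c^*(x) = \arg\min_c \, \lVert W x - W \mu_c \rVert_2^2,

where \mu_c is the mean feature vector of class c and W is a low-rank matrix learned once and shared across all classes, so a new class can be added simply by computing its mean.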
Hierarchical Metric Learning for Fine Grained Image Classification
This paper deals with the problem of fine-grained image classification and introduces the notion of hierarchical metric learning for this task. It is challenging to categorize fine-grained image classes merely in terms of a single-level classifier given the subtle inter-class visual differences. In order to tackle this problem, we propose a two-stage framework where i) the image categorie...
Image-to-Class Distance Metric Learning for Image Classification
Image-To-Class (I2C) distance was first used in the Naive-Bayes Nearest-Neighbor (NBNN) classifier for image classification and has successfully handled datasets with large intra-class variances. However, the performance of this distance relies heavily on the large number of local features in the training set and test image, which incurs a heavy computation cost for nearest-neighbor (NN) search in the t...
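For reference, the Image-To-Class distance referred to above takes the standard NBNN form (symbols here are illustrative):

\mathrm{Dist}(I, C) = \sum_{i=1}^{n} \lVert f_i - \mathrm{NN}_C(f_i) \rVert_2^2,

where f_1, ..., f_n are the local descriptors of image I and \mathrm{NN}_C(f_i) is the descriptor closest to f_i among all training descriptors of class C; the computational cost noted above comes from these per-descriptor nearest-neighbor searches.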
Convex Optimizations for Distance Metric Learning and Pattern Classification
The goal of machine learning is to build automated systems that can classify and recognize complex patterns in data. Not surprisingly, the representation of the data plays an important role in determining what types of patterns can be automatically discovered. Many algorithms for machine learning assume that the data are represented as elements in a metric space. For example, in popular algorit...
Distance Metric Learning for Large Margin Nearest Neighbor Classification
We show how to learn a Mahalanobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k-nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven data sets of varying size and difficulty, we find that metrics trained in this way lead to signif...
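For context, the Mahalanobis metric being learned above has the standard parameterized form (the large-margin objective itself is only summarized here, not reproduced):

d_M(x_i, x_j)^2 = (x_i - x_j)^T M (x_i - x_j), \quad M \succeq 0,

and the semidefinite program chooses M so that each example's target neighbors of the same class are pulled close while differently labeled examples are pushed at least a unit margin farther away.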
Journal
Journal Title: IEEE Access
Year: 2017
ISSN: 2169-3536
DOI: 10.1109/access.2017.2669203